Learning meanings for sentences
Abstract
Consider an English sentence of length n, say “Cats like to chase mice” with n = 5. Suppose that we are given a binary tree representing the syntactic structure of the sentence. Each word is a leaf node of the tree, and there are n − 1 internal nodes. Each internal node covers a phrase of two or more consecutive words. We will associate a column vector in R^d with each node, to represent the meaning of the corresponding phrase. A typical value for the dimensionality d is 100. The meaning of each word is initialized to be a random vector in R^d. This means that we create a fixed lexicon containing a random vector for each word. Each random vector is generated independently from a d-dimensional Gaussian with mean zero and diagonal covariance matrix σI. Each time the same word is used in any sentence, the same vector is used as its meaning. The meaning of a phrase is a function of the meanings of its two components; this is called a compositional approach to semantics. Let the node k have children i and j, whose meanings are x_i and x_j. The meaning of node k is
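The setup above can be sketched in a few lines of Python. The fixed random lexicon and the tree recursion follow the text directly (d = 100, one shared vector per word); the composition function itself is cut off in the excerpt, so the choice below, x_k = tanh(W[x_i; x_j] + b), is only an assumption, a form commonly used for recursive networks. The names `word_vector`, `compose`, and `meaning`, and the value of σ, are all hypothetical.

```python
import numpy as np

d = 100                     # dimensionality of meaning vectors, as in the text
rng = np.random.default_rng(0)
lexicon = {}                # fixed lexicon: word -> its random meaning vector

def word_vector(word, sigma=1.0):
    """Sample a word's vector once from N(0, sigma^2 I); reuse it afterwards."""
    if word not in lexicon:
        lexicon[word] = rng.normal(0.0, sigma, size=d)
    return lexicon[word]

# Assumed composition parameters (the excerpt stops before specifying them).
W = rng.normal(0.0, 0.01, size=(d, 2 * d))
b = np.zeros(d)

def compose(x_i, x_j):
    """Meaning of a parent node as a function of its two children (assumed form)."""
    return np.tanh(W @ np.concatenate([x_i, x_j]) + b)

def meaning(tree):
    """tree is either a word (leaf) or a pair (left_subtree, right_subtree)."""
    if isinstance(tree, str):
        return word_vector(tree)
    left, right = tree
    return compose(meaning(left), meaning(right))

# "Cats like to chase mice": n = 5 leaves, n - 1 = 4 internal nodes.
sentence = ("Cats", (("like", "to"), ("chase", "mice")))
x = meaning(sentence)       # vector in R^d for the whole sentence
```

Because the lexicon is keyed by word, every later occurrence of, say, “Cats” in any sentence reuses the identical vector, exactly as the text requires.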